Divergence measures based on the Shannon entropy

Author

  • Jianhua Lin
Abstract

A new class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error is established in terms of bounds. These bounds are crucial in many applications of divergence measures. The new measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
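
The best-known member of this new class is what is now called the Jensen-Shannon divergence, JS(p, q) = H((p + q)/2) - (H(p) + H(q))/2, where H is the Shannon entropy. Below is a minimal Python sketch (the three-point example is mine, not the paper's) showing that JS stays finite and bounded by 1 bit even where the Kullback-Leibler divergence blows up:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; 0 * log(0) is taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: JS(p, q) = H((p+q)/2) - (H(p) + H(q))/2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# q puts zero mass where p does not, so KL(p || q) is infinite,
# yet the Jensen-Shannon divergence stays finite and bounded by 1 bit.
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(js_divergence(p, q))  # 0.5 -- finite despite mismatched supports
```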


Related articles

A Goodness of Fit Test For Exponentiality Based on Lin-Wong Information

In this paper, we introduce a goodness-of-fit test for exponentiality based on the Lin-Wong divergence measure. In order to estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...
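
For context, Vasicek's spacings estimator of differential entropy, which the test above adapts, is H(m, n) = (1/n) * sum_i log((n / (2m)) * (X_(i+m) - X_(i-m))), with order statistics outside the sample range clamped to the extremes. A sketch under those usual conventions (the window width m = 10 and the exponential test sample are my choices, not taken from this paper):

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek (1976) spacings estimator of differential entropy, in nats."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # X_(i+m), clamped at the maximum
    lower = x[np.maximum(idx - m, 0)]      # X_(i-m), clamped at the minimum
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

# Standard exponential sample: the true differential entropy is 1 nat.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.exponential(size=1000), m=10))  # close to 1
```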


Extended MULTIMOORA method based on Shannon entropy weight for materials selection

Selection of an appropriate material is a crucial step in the engineering design and manufacturing process. Without a systematic technique, many useful engineering materials may be overlooked. The category of multiple-attribute decision-making (MADM) methods is an effective set of structured techniques. Having uncomplicated assumptions and mathematics, the MULTIMOORA method as an MADM appro...
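
The Shannon-entropy weighting step referred to here is the standard MADM recipe: column-normalize the decision matrix into distributions, score each criterion's entropy, and give more weight to criteria with more diversification (lower entropy). A generic sketch of that procedure, not the paper's MULTIMOORA implementation (the decision matrix is made up):

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy criterion weights for a decision matrix X
    (rows = alternatives, columns = criteria, entries > 0)."""
    X = np.asarray(X, dtype=float)
    m = X.shape[0]
    p = X / X.sum(axis=0)                          # normalize each criterion column
    E = -(p * np.log(p)).sum(axis=0) / np.log(m)   # per-criterion entropy in [0, 1]
    d = 1.0 - E                                    # degree of diversification
    return d / d.sum()                             # weights sum to 1

X = np.array([[7.0, 0.2, 300.0],
              [6.5, 0.9, 250.0],
              [8.0, 0.3, 310.0]])
print(entropy_weights(X))  # one weight per criterion
```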


Extropy: Complementary Dual of Entropy

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon’s entropy function has a complementary dual function which we call “extropy.” The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of d...
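
The binary-identity claim is easy to check against the published definition of extropy, J(p) = -sum_i (1 - p_i) log(1 - p_i), as I recall it from this article's authors; the test distributions below are mine:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log(p_i), in nats."""
    p = np.asarray(p, dtype=float)
    return -np.sum(np.where(p > 0, p * np.log(p), 0.0))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) * log(1 - p_i), in nats."""
    q = 1.0 - np.asarray(p, dtype=float)
    return -np.sum(np.where(q > 0, q * np.log(q), 0.0))

print(entropy([0.3, 0.7]), extropy([0.3, 0.7]))            # identical: binary case
print(entropy([0.2, 0.3, 0.5]), extropy([0.2, 0.3, 0.5]))  # the pair differs for n > 2
```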


Shannon Shakes Hands with Chernoff: Big Data Viewpoint On Channel Information Measures

Shannon entropy is the most crucial foundation of information theory, and it has proven effective in many fields, such as communications. Rényi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is an effective measure of channel information, since it reflects the relation between output variables an...
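
For reference, the two measures named alongside Shannon entropy have standard closed forms: the Rényi entropy H_a(p) = log(sum_i p_i^a) / (1 - a) for a != 1, and the Chernoff information C(p, q) = -min over s in [0, 1] of log sum_i p_i^s * q_i^(1-s). A sketch using a plain grid search for the minimization (the grid size and toy distributions are my choices):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha != 1, in nats; alpha -> 1 recovers Shannon."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def chernoff_information(p, q, grid=10001):
    """Chernoff information via grid search over the exponent s in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    s = np.linspace(0.0, 1.0, grid)[:, None]
    return -np.log(np.sum(p ** s * q ** (1.0 - s), axis=1).min())

p, q = np.array([0.6, 0.4]), np.array([0.3, 0.7])
print(renyi_entropy(p, alpha=2.0))   # collision entropy of p
print(chernoff_information(p, q))    # best achievable error exponent
```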


Measures of trajectory ensemble disparity in nonequilibrium statistical dynamics

Many interesting divergence measures between conjugate ensembles of nonequilibrium trajectories can be experimentally determined from the work distribution of the process. Herein, we review the statistical and physical significance of several of these measures, in particular the relative entropy (dissipation), Jeffreys divergence (hysteresis), Jensen–Shannon divergence (time asymmetry), Chernoff...
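
Of the divergences listed, the relative entropy and the Jeffreys divergence have the standard forms below (the Jensen-Shannon divergence is sketched near the top of this page); the toy vectors merely stand in for forward and reverse work distributions:

```python
import numpy as np

def relative_entropy(p, q):
    """D(p || q) in nats; finite only if q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def jeffreys(p, q):
    """Jeffreys divergence: the symmetrized relative entropy."""
    return relative_entropy(p, q) + relative_entropy(q, p)

p, q = np.array([0.7, 0.2, 0.1]), np.array([0.5, 0.3, 0.2])
print(relative_entropy(p, q), jeffreys(p, q))
```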



Journal:
  • IEEE Trans. Information Theory

Volume 37, Issue -

Pages -

Publication date: 1991